

Search for: All records

Creators/Authors contains: "Endert, Alex"

Note: Clicking a Digital Object Identifier (DOI) number takes you to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo period.

Some links on this page may take you to non-federal websites, whose policies may differ from this site's.

  1. Free, publicly-accessible full text available July 4, 2026
  2. Free, publicly-accessible full text available March 24, 2026
  3. Data visualization literacy is essential for K-12 students, yet existing practices emphasize interpreting pre-made visualizations rather than creating them. To address this, we developed the DPV (Domain, Purpose, Visual) framework, which guides middle school students through the visualization design process. The framework simplifies design into three stages: understanding the problem domain, specifying the communication purpose, and translating data into effective visuals. Implemented in a two-week summer camp as a usage scenario, the DPV framework enabled students to create visualizations addressing community issues. Evaluation of student artifacts, focus group interviews, and surveys demonstrated its effectiveness in enhancing students' design skills and understanding of visualization concepts. This work highlights the DPV framework's potential to foster data visualization literacy in K-12 education and broaden participation in the data visualization community.
    Free, publicly-accessible full text available January 1, 2026
  4. Free, publicly-accessible full text available January 1, 2026
  5. Free, publicly-accessible full text available January 1, 2026
  6. Many papers make claims about specific visualization techniques that are said to enhance or calibrate trust in AI systems. But a design choice that enhances trust in some cases appears to damage it in others. In this paper, we explore this inherent duality through an analogy with “knobs”. Turning a knob too far in one direction may result in under-trust; too far in the other, in over-trust; or, turned up further still, in a confusing distortion. While the designs, or so-called “knobs”, are not inherently evil, they can be misused or used in an adversarial context and thereby manipulated to mislead users or promote unwarranted levels of trust in AI systems. When a visualization that has no meaningful connection with the underlying model or data is employed to enhance trust, we refer to the result as “trust junk.” From a review of 65 papers, we identify nine commonly made claims about trust calibration. We synthesize them into a framework of knobs that can be used for good or “evil,” and distill our findings into observed pitfalls for the responsible design of human-AI systems.